
    Use of Inferential Statistics to Design Effective Communication Protocols for Wireless Sensor Networks

    This thesis explores the issues and techniques associated with employing the principles of inferential statistics to design effective Medium Access Control (MAC), routing and duty cycle management strategies for multihop Wireless Sensor Networks (WSNs). The main objectives of these protocols are to maximise the throughput of the network, to prolong the lifetime of nodes and to reduce the end-to-end delay of packets over a general network scenario, without particular consideration for specific topology configurations, traffic patterns or routing policies. WSNs represent one of the leading-edge technologies that have received substantial research effort due to their prominent roles in many applications. However, designing effective communication protocols for WSNs is particularly challenging due to the scarce resources of these networks and the requirement for large-scale deployment. The MAC, routing and duty cycle management protocols are amongst the important strategies required to ensure correct operation of WSNs. This thesis makes use of inferential statistics to design these protocols; inferential statistics was selected as it provides a rich design space with powerful approaches and methods. The MAC protocol proposed in this thesis exploits the statistical characteristics of the Gamma distribution to enable each node to adjust its contention parameters dynamically based on its inference of channel occupancy. This technique reduces the service time of packets and increases throughput by improving channel utilisation. Reducing the service time minimises the energy consumed in contending for the channel, which in turn prolongs the lifetime of nodes. The proposed duty cycle management scheme uses non-parametric Bayesian inference to enable each node to determine the best times and durations for sleeping without imposing overheads on the network. Hence the lifetime of each node is prolonged by mitigating the energy wasted in overhearing and idle listening. Prolonging the lifetime of nodes increases the throughput of the network and reduces the end-to-end delay, as it allows nodes to route their packets over optimal paths for longer periods. The proposed routing protocol uses a state-of-the-art inference technique dubbed spatial reasoning, which enables each node to figure out the spatial relationships between nodes without overwhelming the network with control packets. As a result, the end-to-end delay is reduced while the throughput and lifetime are increased. Besides the proposed protocols, this thesis utilises the analytical aspects of statistics to develop rigorous analytical models that can accurately predict the queuing and medium access delay and the energy consumption over multihop networks. Moreover, this thesis provides a broader perspective on the design of communication protocols for WSNs by casting the operations of these networks in the domains of the artificial chemistry discipline and the harmony search optimisation algorithm.
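    The Gamma-based contention adaptation described above can be illustrated with a minimal sketch. The function names and the method-of-moments fit are our own assumptions for illustration, not the thesis's actual algorithm:

```python
import random

def fit_gamma_moments(samples):
    """Method-of-moments fit of a Gamma distribution (shape k, scale theta)
    to observed channel-busy durations."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    k = mean ** 2 / var   # shape parameter
    theta = var / mean    # scale parameter
    return k, theta

def choose_backoff(busy_samples, rng=random):
    """Draw a backoff delay from the Gamma model inferred from the node's
    own observations, so contention adapts to channel occupancy."""
    k, theta = fit_gamma_moments(busy_samples)
    return rng.gammavariate(k, theta)
```

    In this sketch, a node that observes longer busy periods infers a heavier channel load and tends to draw longer backoffs, which is the spirit of the dynamic contention adjustment.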

    Risk Factors of Epistaxis in Primary School Children in Dakahlia Governorate, Egypt

    Background: Epistaxis is a common pediatric complaint, so we studied its risk factors in an attempt to control and reduce it among primary school children. Objectives: To determine the frequency of epistaxis and to assess its risk factors in primary school children (8-12 years old). Patients and Methods: A cross-sectional study conducted on 312 children. The sample was collected by a systematic random technique from children attending the Family Health Center, Sanafa, Mit-Ghamr city, Dakahlia Governorate, Egypt. Participants' data regarding socio-demographic characteristics and risk factors were collected via a predesigned questionnaire. Results: The study revealed that the prevalence of epistaxis among the studied group was 32.4%, and the most common risk factors were smoking, head trauma, factory gases, chronic cough and upper respiratory tract infection (URTI) (56.7%, 45.8%, 35.6%, 32.7% and 30.1%, respectively). Conclusion: Recurrent epistaxis can be troublesome and alarming for parents and children, so special attention must be given to it.

    Left ventricle segmentation and quantification using deep learning

    Cardiac MRI is a widely used noninvasive tool that provides an evaluation of cardiac anatomy and function, and it can also be used for heart diagnosis. Heart diagnosis through the estimation of physiological heart parameters requires careful segmentation of the left ventricle (LV) from cardiac MRI images. Therefore, we aim at building a new deep learning method for the automated delineation and quantification of the LV from cine cardiac MRI. Our goal is to reach lower errors for the calculated heart parameters than previous works by introducing a new deep learning cardiac segmentation method. Our pipeline starts with accurate LV localization by finding the LV cavity center point using a fully convolutional neural network (FCN) model called FCN1. Then, from all heart sections, we extract a region of interest (ROI) that encompasses the LV. Segmentation of the LV cavity and myocardium is performed on the extracted ROIs using another FCN called FCN2. The FCN2 model incorporates multiple bottleneck layers and has a smaller memory footprint than traditional models such as U-net. Furthermore, we introduce a novel loss function called the radial loss, which minimizes the distance between the ground truth LV contours and the predicted contours. After myocardial segmentation, we estimate the functional and mass parameters of the LV. We used the Automated Cardiac Diagnosis Challenge (ACDC-2017) dataset to validate our pipeline, which provided better segmentation, accurate calculation of heart parameters, and fewer errors compared to other approaches applied to the same dataset. Additionally, our segmentation approach generalizes well across different datasets, as validated on a locally collected cardiac dataset. To sum up, we propose a novel deep learning framework that can be translated into a clinical tool for cardiac diagnosis.
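    One simple reading of a radial loss can be sketched as follows. This NumPy version measures contour radii from the LV cavity center and assumes point-wise angular correspondence between contours; it is an illustrative stand-in, not the paper's exact formulation:

```python
import numpy as np

def radial_loss(pred, gt, center):
    """Mean absolute difference between predicted and ground-truth contour
    radii, measured from the LV cavity center point. Assumes the two
    contours are sampled at matching angles (same number of points)."""
    pred_r = np.linalg.norm(pred - center, axis=1)  # radii of predicted contour
    gt_r = np.linalg.norm(gt - center, axis=1)      # radii of ground-truth contour
    return float(np.mean(np.abs(pred_r - gt_r)))
```

    A loss of this form directly penalizes how far the predicted boundary sits from the true boundary along each ray from the cavity center, which matches the stated goal of minimizing contour-to-contour distance.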

    IoT-Based Water Quality Assessment System for Industrial Waste Water: Healthcare Perspective

    The environment, especially water, is polluted by industrialization and urbanization, with harmful effects on both the environment and life on Earth. Polluted water can cause food poisoning, diarrhea, short-term gastrointestinal problems, respiratory diseases, skin problems, and other serious health complications. In a developing country like Bangladesh, where the ready-made garments sector is one of the major contributors to the Gross Domestic Product (GDP), most of the waste released from garment factories is dumped into the nearest rivers or canals. Hence, the water quality of these bodies becomes very incompatible with living beings, and it has become one of the major threats to the environment and human health. In addition, the amount of fish in the rivers and canals of Bangladesh is decreasing day by day as a result of water pollution. Therefore, to save fish, other aquatic animals, and the environment, we need to monitor the quality of the water and find out the causes of the pollution. Real-time monitoring of water quality is vital for controlling water pollution, yet most existing approaches are biological and lab-based, which takes a lot of time and resources. To address this issue, we developed an Internet of Things (IoT)-based real-time water quality monitoring system, integrated with a mobile application. The proposed system measures some of the most important indexes of water quality, including the potential of hydrogen (pH), total dissolved solids (TDS), turbidity, and temperature. The results of the proposed system will be very helpful in saving the environment and, thus, in improving the health of living creatures on Earth.
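    The four measured indexes map naturally onto a threshold check on the monitoring side. The limits below are illustrative values roughly in line with common drinking-water guidance, not the thresholds used by the authors:

```python
def water_quality_ok(ph, tds_ppm, turbidity_ntu, temp_c):
    """Flag a water sample as acceptable. The thresholds are illustrative
    placeholders; a deployed system would take its limits from the
    applicable water-quality regulation."""
    return (6.5 <= ph <= 8.5            # pH within a near-neutral band
            and tds_ppm <= 1000         # total dissolved solids (ppm)
            and turbidity_ntu <= 5      # turbidity (NTU)
            and 4 <= temp_c <= 35)      # water temperature (deg C)
```

    In an IoT deployment this check would run on fresh sensor readings, with any out-of-range sample pushed as an alert to the mobile application.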

    A CAD System for the Early Prediction of Hypertension based on Changes in Cerebral Vasculature

    © 2019 IEEE. Hypertension is a leading cause of mortality in the US and a significant contributor to many vascular and non-vascular diseases. Previous literature suggests that specific cerebral vascular alterations precede the onset of hypertension. In this manuscript, we propose a magnetic resonance angiography (MRA)-based computer-aided diagnosis (CAD) system for the early detection of hypertension. The steps of the proposed CAD system are: 1) preprocessing the MRA input data to correct the bias resulting from the magnetic field, remove noise, reduce contrast non-uniformities, enhance homogeneity using a generalized Gauss-Markov random field (GGMRF), and normalize the data to aid the segmentation process; 2) delineating the cerebral vasculature automatically and accurately using a deep 3-D convolutional neural network (CNN); 3) extracting vascular features (cerebrovascular diameters and tortuosity) that are reported to change with the progression of hypertension, and constructing the feature vectors; 4) classifying the input data from the feature vectors using a support vector machine (SVM) classifier. We report 90% classification accuracy in distinguishing between normal and potentially hypertensive subjects. These results demonstrate the efficacy of the proposed vascular features for predicting pre-hypertension or hypertension. Clinicians could track the alterations of these vascular features over time in people at risk of developing hypertension for optimal medical management and to mitigate adverse events.
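    Step 3 of the pipeline (diameter and tortuosity features) can be sketched as below. The feature definitions are our own assumptions for illustration; the common arc-over-chord definition of tortuosity is used, and the exact features in the paper may differ:

```python
import numpy as np

def tortuosity(centerline):
    """Arc length over chord length of a vessel centerline polyline;
    1.0 means a perfectly straight vessel, larger means more tortuous."""
    segments = np.diff(centerline, axis=0)
    arc = np.linalg.norm(segments, axis=1).sum()
    chord = np.linalg.norm(centerline[-1] - centerline[0])
    return arc / chord

def feature_vector(diameters, centerline):
    """Per-subject vascular features: mean and spread of the measured
    cerebrovascular diameters, plus centerline tortuosity."""
    return np.array([np.mean(diameters), np.std(diameters),
                     tortuosity(centerline)])
```

    Vectors of this form would then be fed to the SVM classifier in step 4.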

    Analysis of the Importance of Systolic Blood Pressure Versus Diastolic Blood Pressure in Diagnosing Hypertension: MRA Study.

    © 2020 IEEE. Hypertension is one of the most severe and most common diseases nowadays and is considered one of the leading contributors to death worldwide. Specialists tend to diagnose hypertension taking into consideration both systolic and diastolic blood pressure (BP) measurements. However, one clinical hypothesis states that under 50 years of age, diastolic BP may be slightly more predictive of adverse events, while above that age, systolic BP may be more predictive. The question is: should we give more weight to systolic BP or diastolic BP when diagnosing diseases such as hypertension? Three experiments were conducted in this study using magnetic resonance angiography (MRA) data to investigate this question. Each experiment followed the same methodology: 1) preprocess the MRA data to remove noise, bias, and inhomogeneities; 2) segment the cerebral vasculature for each subject using a CNN-based approach; 3) extract vascular features that represent the cerebral alterations that precede and accompany the development of hypertension; and 4) build feature vectors and classify subjects as either normotensive or hypertensive based on the cerebral alterations and the blood pressure measurements. The first experiment was conducted on the original data set of 342 subjects, while the second and third experiments enlarged the original data set by generating synthetic samples to make it sufficiently large and balanced. Experimental results showed that systolic blood pressure might be more predictive than diastolic blood pressure in diagnosing hypertension, with a classification accuracy of 89.3%.
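    The data-balancing step in the second and third experiments can be mimicked with simple jitter-based oversampling. The noise model, function name, and parameters here are our own placeholders; the paper's synthetic-sample generation may be more sophisticated:

```python
import random

def oversample(minority, target_size, noise=0.05, rng=None):
    """Enlarge the minority class to target_size by copying randomly
    chosen feature vectors and adding small Gaussian jitter."""
    rng = rng or random.Random(0)
    out = list(minority)  # keep the original samples unchanged
    while len(out) < target_size:
        base = rng.choice(minority)
        out.append([x + rng.gauss(0.0, noise) for x in base])
    return out
```

    Balancing the two classes this way keeps a classifier from trivially favoring the majority class during training.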

    A Novel Framework for Accurate and Non-Invasive Pulmonary Nodule Diagnosis by Integrating Texture and Contour Descriptors

    An accurate computer-aided diagnostic (CAD) system is very significant and critical for early detection of lung cancer. A new framework for lung nodule classification is proposed in this paper using different imaging markers from a single computed tomography (CT) scan. Texture and shape features are combined to capture the main discriminative characteristics between malignant and benign pulmonary nodules. A 7th-order Markov-Gibbs random field (MGRF) is implemented to give a good description of the nodule's appearance by incorporating spatial data. A Various-views Marginal Aggregation Curvature Scale Space (MACSS) and primitive geometrical properties are used to indicate the nodule's shape complexity. Finally, all these modeled descriptors are combined using a stacked autoencoder and a softmax classifier to give the final diagnosis. Our system has been validated using 727 samples from the Lung Image Database Consortium. Our diagnostic framework's accuracy, sensitivity, and specificity were 94.63%, 93.86%, and 94.78%, respectively, showing that our system serves as an important clinical assistive tool.
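    The fusion stage common to these nodule-diagnosis frameworks ends in a softmax over the benign/malignant classes. A minimal sketch, in which the linear scoring layer stands in for the stacked autoencoder head and all weights are hypothetical placeholders:

```python
import math

def softmax(logits):
    """Numerically stable softmax over class logits
    (benign vs. malignant in this setting)."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

def diagnose(texture_feats, shape_feats, weights, bias):
    """Concatenate texture and shape descriptors and score the two
    classes with a linear layer followed by softmax."""
    fused = texture_feats + shape_feats  # simple feature concatenation
    logits = [sum(w * x for w, x in zip(row, fused)) + b
              for row, b in zip(weights, bias)]
    return softmax(logits)
```

    The point of the sketch is the structure: texture and shape descriptors are computed independently, concatenated, and only then mapped to class probabilities.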

    Novel Approaches For Segmenting Cerebral Vasculature

    In this chapter, we propose two segmentation approaches that can segment the cerebral vasculature automatically and accurately. This would potentially help experts in the early analysis and diagnosis of severe diseases, specifically multiple sclerosis.

    Radiomic-based framework for early diagnosis of lung cancer

    © 2019 IEEE. This paper proposes a new framework for pulmonary nodule diagnosis using radiomic features extracted from a single computed tomography (CT) scan. The proposed framework integrates appearance and shape features to obtain a precise diagnosis of the extracted lung nodules. The appearance features are modeled using a 3D Histogram of Oriented Gradients (HOG) and a higher-order Markov-Gibbs random field (MGRF) model because of their ability to describe the spatial non-uniformity in the texture of the nodule, regardless of its size. The shape features are modeled using Spherical Harmonic expansion and some basic geometric features in order to give a full description of the shape complexity of the nodules. Finally, all the modeled features are fused and fed to a stacked autoencoder to differentiate between malignant and benign nodules. Our framework is evaluated using 727 nodules selected from the Lung Image Database Consortium (LIDC) dataset, and achieved classification accuracy, sensitivity, and specificity of 93.12%, 92.47%, and 93.60%, respectively.

    A Comprehensive Framework for Accurate Classification of Pulmonary Nodules

    © 2020 IEEE. A precise computerized lung nodule diagnosis framework is very important for helping radiologists to diagnose lung nodules at an early stage. In this manuscript, a novel system for pulmonary nodule diagnosis, utilizing features extracted from single computed tomography (CT) scans, is proposed. This system combines robust descriptors for both texture and contour features to give a prediction of the nodule's growth rate, which is the standard clinical information for pulmonary nodule diagnosis. A Spherical Sector Isosurfaces Histogram of Oriented Gradients is developed to describe the nodule's texture, taking spatial information into account. A Multi-views Peripheral Sum Curvature Scale Space is used to capture the nodule's contour complexity. Finally, the two modeled features are augmented together utilizing a deep neural network to diagnose the nodule's malignancy. For validation purposes, the proposed system utilized 727 nodules from the Lung Image Database Consortium. The proposed system's classification accuracy was 94.50%.